What is @samverschueren/stream-to-observable?
The @samverschueren/stream-to-observable npm package converts Node.js streams into observables. This lets developers apply reactive programming techniques to streams, making it easier to compose asynchronous or callback-based code in a readable and expressive way. It is designed for readable streams, but because it accepts any EventEmitter, writable streams and other emitters can also be handled by configuring which events it listens to.
What are @samverschueren/stream-to-observable's main functionalities?
Converting a readable stream to an observable
This feature allows you to convert a readable stream, like one returned by `fs.createReadStream`, into an observable. You can then subscribe to the observable, or apply operators from your Observable implementation (such as RxJS), to process the data events emitted by the stream.
const toObservable = require('@samverschueren/stream-to-observable');
const fs = require('fs');

const stream = fs.createReadStream('file.txt');
const observable = toObservable(stream);

observable.subscribe({
  next(data) { console.log(data.toString()); },
  error(err) { console.error('Something went wrong: ', err); },
  complete() { console.log('Done reading file'); }
});
Converting a writable stream to an observable
This feature enables the conversion of a writable stream into an observable, which is particularly useful for tracking when the stream has finished writing. Because writable streams emit `finish` rather than `end`, the `endEvent` option is used to signal completion.
const toObservable = require('@samverschueren/stream-to-observable');
const { Writable } = require('stream');

const writableStream = new Writable({
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  }
});

// Writable streams emit 'finish' rather than 'end', so override the end event
const observable = toObservable(writableStream, {endEvent: 'finish'});

observable.subscribe({
  complete() { console.log('Done writing to stream'); }
});

writableStream.write('hello');
writableStream.end();
Other packages similar to @samverschueren/stream-to-observable
rxjs
RxJS is a library for reactive programming using Observables, to make it easier to compose asynchronous or callback-based code. While not a direct alternative, it provides the foundational Observable implementation that @samverschueren/stream-to-observable relies on. It offers more comprehensive features for creating and manipulating observables but does not specifically focus on converting streams to observables.
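For comparison, here is a minimal sketch of doing a similar conversion with RxJS alone (assuming RxJS 6+ with `fromEvent` and `takeUntil`; the file name is a placeholder):

const fs = require('fs');
const { fromEvent } = require('rxjs');
const { map, takeUntil } = require('rxjs/operators');

const stream = fs.createReadStream('file.txt');

// Emit each 'data' chunk as a string until the stream signals 'end'
const data$ = fromEvent(stream, 'data').pipe(
  map(chunk => chunk.toString()),
  takeUntil(fromEvent(stream, 'end'))
);

data$.subscribe(line => console.log(line));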
from2
from2 is a high-level module for creating readable streams that properly handle backpressure. Whereas @samverschueren/stream-to-observable consumes existing streams, from2 focuses on creating streams from data sources rather than converting them to observables.
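As a rough sketch of the difference, from2 builds a stream from a pull function (default non-object mode; the chunk contents are placeholders):

const from2 = require('from2');

// Emit two chunks, then end the stream by passing null
const chunks = ['hello', 'world', null];

const stream = from2(function (size, next) {
  next(null, chunks.shift());
});

stream.on('data', chunk => console.log(chunk.toString()));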
stream-to-promise
stream-to-promise is another utility that deals with stream completion, but instead of converting streams to observables, it converts them to promises. This is useful for async/await patterns but does not offer the continuous data handling capabilities of observables.
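A minimal sketch of the promise-based approach, assuming stream-to-promise's behavior of resolving a readable stream with its buffered contents (the file name is a placeholder):

const fs = require('fs');
const streamToPromise = require('stream-to-promise');

// Resolves once with everything the stream emitted, rather than per chunk
streamToPromise(fs.createReadStream('file.txt'))
  .then(buffer => console.log(buffer.toString()))
  .catch(err => console.error(err));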
stream-to-observable
Convert Node Streams into ECMAScript-Observables
Observables are rapidly gaining popularity. They have much in common with Streams, in that they both represent data that arrives over time. Most Observable implementations provide expressive methods for filtering and mutating incoming data. Methods like `.map()`, `.filter()`, and `.forEach()` behave very similarly to their Array counterparts, so using Observables can be very intuitive.
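For example, with an implementation such as zen-observable (an assumption here; any implementation exposing these helper methods works), the Array-like methods read naturally:

const Observable = require('zen-observable');

Observable.of(1, 2, 3, 4)
  .map(n => n * 2)      // 2, 4, 6, 8
  .filter(n => n > 4)   // 6, 8
  .forEach(n => console.log(n))
  .then(() => console.log('done'));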
Learn more about Observables
Note: This module was forked from stream-to-observable and released under a different name due to inactivity.
Install
$ npm install --save @samverschueren/stream-to-observable
`stream-to-observable` relies on `any-observable`, which will search for an available Observable implementation. You need to install one yourself:
$ npm install --save zen-observable
or
$ npm install --save rxjs
If your code relies on a specific Observable implementation, you should likely specify one using `any-observable`'s registration shortcuts.
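For example, to pin the implementation with one of any-observable's registration shortcuts (a sketch; the exact shortcut paths come from any-observable's docs and may vary by version):

// Must run before any-observable is first loaded anywhere in the process
require('any-observable/register/zen');

const streamToObservable = require('@samverschueren/stream-to-observable');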
Usage
const fs = require('fs');
const split = require('split');
const streamToObservable = require('@samverschueren/stream-to-observable');

const readStream = fs
  .createReadStream('./hello-world.txt', {encoding: 'utf8'})
  .pipe(split());

streamToObservable(readStream)
  .filter(chunk => /hello/i.test(chunk))
  .map(chunk => chunk.toUpperCase())
  .forEach(chunk => {
    console.log(chunk);
  });
The `split` module above will chunk the stream into individual lines. This is often very handy for text streams, as each observable event is guaranteed to be a line.
API
streamToObservable(stream, [options])
stream
Type: ReadableStream
Note: `stream` can technically be any `EventEmitter` instance. By default, this module listens to the standard Stream events (`data`, `error`, and `end`), but those are configurable via the `options` parameter. If you are using this with a standard Stream, you likely won't need the `options` parameter.
options
await
Type: Promise
If provided, the Observable will not "complete" until `await` is resolved. If `await` is rejected, the Observable will immediately emit an `error` event and disconnect from the stream. This is mostly useful when attaching to the `stdin` or `stdout` streams of a `child_process`. Those streams usually do not emit `error` events, even if the underlying process exits with an error. This provides a means to reject the Observable if the child process exits with an unexpected error code.
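A minimal sketch of the `await` option with a child process; the command and the hand-rolled exit-code promise are illustrative, not part of this module's API:

const childProcess = require('child_process');
const streamToObservable = require('@samverschueren/stream-to-observable');

const child = childProcess.spawn('ls', ['-la']);

// Reject if the process exits with a non-zero code
const exited = new Promise((resolve, reject) => {
  child.on('exit', code => {
    code === 0 ? resolve() : reject(new Error(`Exited with code ${code}`));
  });
});

streamToObservable(child.stdout, {await: exited})
  .forEach(chunk => console.log(chunk.toString()));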
endEvent
Type: String or false
Default: "end"
If you are using an `EventEmitter` or non-standard Stream, you can change which event signals that the Observable should be completed.
Setting this to `false` will avoid listening for any end events.
Setting this to `false` and providing an `await` Promise will cause the Observable to resolve immediately with the `await` Promise (the Observable will remove all its `data` event listeners from the stream once the Promise is resolved).
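A sketch of combining `endEvent: false` with `await`, so that only the promise controls completion (the emitter and timeout-based promise below are placeholders):

const EventEmitter = require('events');
const streamToObservable = require('@samverschueren/stream-to-observable');

const emitter = new EventEmitter();
const done = new Promise(resolve => setTimeout(resolve, 1000));

// No end event is listened for; the Observable completes when `done` resolves
streamToObservable(emitter, {endEvent: false, await: done})
  .forEach(value => console.log(value));

emitter.emit('data', 'hello');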
errorEvent
Type: String or false
Default: "error"
If you are using an `EventEmitter` or non-standard Stream, you can change which event signals that the Observable should be closed with an error.
Setting this to `false` will avoid listening for any error events.
dataEvent
Type: String
Default: "data"
If you are using an `EventEmitter` or non-standard Stream, you can change which event causes data to be emitted to the Observable.
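A sketch of the non-standard-event case, using a plain EventEmitter with custom event names (the event names below are made up for illustration):

const EventEmitter = require('events');
const streamToObservable = require('@samverschueren/stream-to-observable');

const emitter = new EventEmitter();

streamToObservable(emitter, {
  dataEvent: 'message',
  endEvent: 'finished',
  errorEvent: 'failed'
}).subscribe({
  next(value) { console.log('got:', value); },
  error(err) { console.error('failed:', err); },
  complete() { console.log('finished'); }
});

emitter.emit('message', 'hello');
emitter.emit('message', 'world');
emitter.emit('finished');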
Learn about Observables
Transform Streams
`data` events on the stream will be emitted as events in the Observable. Since most native streams emit chunks of binary data, you will likely want to use a TransformStream to convert those chunks of binary data into an object stream. `split` is just one popular TransformStream that splits streams into individual lines of text.
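For instance, a minimal sketch of a custom object-mode Transform that parses newline-delimited JSON before handing the stream to this module (the file name and format are assumptions):

const fs = require('fs');
const { Transform } = require('stream');
const streamToObservable = require('@samverschueren/stream-to-observable');

// Parse each incoming line of JSON into an object.
// Note: assumes each chunk contains complete lines; a real parser would buffer partial lines.
const jsonLines = new Transform({
  readableObjectMode: true,
  transform(chunk, encoding, callback) {
    for (const line of chunk.toString().split('\n')) {
      if (line.trim() !== '') {
        this.push(JSON.parse(line));
      }
    }
    callback();
  }
});

const objectStream = fs.createReadStream('./events.ndjson').pipe(jsonLines);

streamToObservable(objectStream)
  .forEach(event => console.log(event));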
Caveats
It's important to note that using this module disables back-pressure controls on the stream. As such, it should not be used where back-pressure throttling is required (e.g. high-volume web servers). It still has value for larger projects, as it can make unit testing streams much cleaner.
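As a rough sketch of the unit-testing angle (the fake stream and assertions are placeholders; `forEach` returning a Promise is what makes this convenient in tests):

const assert = require('assert');
const { Readable } = require('stream');
const streamToObservable = require('@samverschueren/stream-to-observable');

// A fake stream standing in for whatever the code under test produces
const fakeStream = Readable.from(['foo\n', 'bar\n']);
const lines = [];

streamToObservable(fakeStream)
  .forEach(chunk => lines.push(chunk.toString().trim()))
  .then(() => {
    assert.deepStrictEqual(lines, ['foo', 'bar']);
    console.log('stream test passed');
  });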
License
MIT